The separating hyperplane of the traditional support vector machine is sensitive to noise and outliers.
The SVM maps the recognition features nonlinearly into a high-dimensional feature space and constructs the optimal separating hyperplane in that space to classify the data, realizing modulation recognition of communication signals.
To address this problem, a separating hyperplane is built on the principle of maximizing the distance between the two class centers, yielding a novel support vector machine called the maximal class-center margin support vector machine (MCCM-SVM).
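A minimal sketch of the class-center idea (not the MCCM-SVM formulation itself): take the normal vector as the difference of the two class means and place the plane at their midpoint. The function names and the midpoint bias are assumptions made for illustration.

    import numpy as np

    def class_center_hyperplane(X0, X1):
        # Illustrative hyperplane built from the two class centers only.
        # The normal points from the center of class 0 to the center of
        # class 1, and the plane passes through the midpoint of the centers.
        c0, c1 = X0.mean(axis=0), X1.mean(axis=0)
        w = c1 - c0
        b = -w.dot((c0 + c1) / 2.0)
        return w, b

    def predict(X, w, b):
        # +1 -> class 1 side of the plane, -1 -> class 0 side
        return np.sign(X.dot(w) + b)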
Unlike traditional incremental learning methods, the idea proposed here is that newly added data lying near the separating hyperplane are significant for forming the new hyperplane, regardless of whether the initial classifier assigns them to the test-error set berr or the test-right set bok.
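A hedged sketch of this incremental idea, assuming a fitted scikit-learn SVC as the initial classifier: keep the old support vectors and, from the newly arrived batch, retain every sample that falls near the current hyperplane, whether the old classifier labels it correctly (bok) or not (berr). The margin_band threshold and the helper name are assumptions.

    import numpy as np
    from sklearn.svm import SVC

    def incremental_update(clf, X_old, y_old, X_new, y_new, margin_band=1.0):
        # Samples close to the current hyperplane: small |decision value|,
        # regardless of whether they are classified correctly or not.
        near = np.abs(clf.decision_function(X_new)) <= margin_band
        # Retrain on the old support vectors plus the near-boundary newcomers.
        X_keep = np.vstack([X_old[clf.support_], X_new[near]])
        y_keep = np.concatenate([y_old[clf.support_], y_new[near]])
        return SVC(kernel=clf.kernel, C=clf.C, gamma=clf.gamma).fit(X_keep, y_keep)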
The basic idea of the SVM is to map the input data through a nonlinear transformation into a high-dimensional feature space and to construct the optimal separating hyperplane in that space. It offers many advantages in solving small-sample, nonlinear and high-dimensional pattern recognition problems, and it extends to other machine-learning problems such as function fitting.
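A minimal illustration of this mapping-then-separating idea with scikit-learn; the RBF kernel, the parameter values and the toy XOR-style data are assumptions used only for illustration.

    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] * X[:, 1])   # not linearly separable in input space

    # The RBF kernel implicitly maps the inputs into a high-dimensional
    # feature space, where SVC finds the optimal separating hyperplane.
    clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
    print("training accuracy:", clf.score(X, y))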
The separating hyperplane with the maximal margin is the optimal separating hyperplane and has good generalization ability. Finding the optimal separating hyperplane leads to a quadratic programming problem, a special kind of optimization problem. After the optimization every vector (sample) is assigned a weight; a vector whose weight is nonzero is called a support vector, and the separating hyperplane is constructed from the support vectors.
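For reference, the quadratic program in question is the standard soft-margin dual (C is the penalty parameter, K the kernel); the weights mentioned above are the Lagrange multipliers \alpha_i, and the support vectors are exactly the samples with \alpha_i > 0:

    \max_{\alpha}\ \sum_{i=1}^{n}\alpha_i-\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n}\alpha_i\alpha_j y_i y_j K(x_i,x_j)
    \qquad\text{s.t.}\quad \sum_{i=1}^{n}\alpha_i y_i=0,\quad 0\le\alpha_i\le C.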
The support vector machine is a new kind of general learning machine based on statistical learning theory. Its key idea is to use a kernel function to map a complicated classification task from the input space into a high-dimensional feature space, where a linear separating hyperplane is constructed. The margin is the distance between this hyperplane and the hyperplane through the closest points.
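In the canonical scaling, where the closest points satisfy |w·φ(x)+b| = 1, the margin equals 2/||w||, and the kernel lets the decision function be written without ever forming the feature map φ explicitly:

    f(x)=\operatorname{sgn}\Bigl(\sum_{i\in SV}\alpha_i y_i K(x_i,x)+b\Bigr),
    \qquad \text{margin}=\frac{2}{\lVert w\rVert}.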
The separating hyperplane is constructed from the support vectors. Because real-world data sets are usually large, the optimization must be efficient. Decomposition is the first practical method for handling large data sets: it splits the training set into a fixed-size working (active) set and an inactive set, and each iteration solves only the sub-optimization problem defined on the working set.
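A toy sketch of the decomposition loop, under simplifying assumptions: the bias term is dropped (so the equality constraint on the alphas disappears), working sets are consecutive fixed-size blocks rather than being chosen from KKT violations as in SMO-style solvers, and each sub-problem is solved by a few projected-gradient steps.

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        sq = np.sum(X ** 2, axis=1)
        return np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    def decomposition_svm(X, y, C=1.0, gamma=1.0, block=50, sweeps=20, lr=0.01):
        # Toy decomposition training of the SVM dual without a bias term.
        # Each inner step updates only the alphas in the current working set
        # while the alphas of the inactive set stay fixed.
        n = len(y)
        Q = (y[:, None] * y[None, :]) * rbf_kernel(X, gamma)
        alpha = np.zeros(n)
        for _ in range(sweeps):
            for start in range(0, n, block):
                B = slice(start, min(start + block, n))    # working set
                for _ in range(10):                        # sub-problem on B
                    grad_B = 1.0 - Q[B, :] @ alpha         # dual gradient
                    alpha[B] = np.clip(alpha[B] + lr * grad_B, 0.0, C)
        return alpha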